muHVT: Collection of functions used for vector quantization and construction of hierarchical Voronoi tessellations for data analysis

Zubin Dowlaty, Shubhra Prakash, Sangeet Moy Das, Praditi Shah, Shantanu Vaidya, Somya Shambhawi

2023-06-05

1 Abstract

The muHVT package is a collection of R functions to facilitate building topology preserving maps for rich multivariate data, particularly datasets tending towards big data, i.e. a large number of rows. The functions for this typical workflow are organized below:

  1. Data Compression: Vector quantization (VQ) and HVQ (hierarchical vector quantization) using means or medians. This step compresses the rows of a long data frame according to a compression objective.

  2. Data Projection: Dimension projection of the compressed cells to 1D, 2D, or 3D with Sammon's non-linear projection algorithm. This step creates the topology preserving map (also known as an embedding) coordinates in the desired output dimension.

  3. Tessellation: Creation of the cells required for object visualization using the Voronoi tessellation method; the package includes heatmap plots for hierarchical Voronoi tessellations (HVT). This step enables data insights, visualization, and interaction with the topology preserving map, and is useful for semi-supervised tasks.

  4. Prediction: Scoring of new datasets and recording their assignment using the map objects from the above steps, in a sequence of maps if required.
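
A minimal sketch of this four-step workflow with the package's main functions is shown below (df and new_df are placeholder data frames; the parameter values are illustrative, not defaults):

library(muHVT)

# Steps 1-3: compress, project to 2D, and build the tessellation
hvt.results <- HVT(df,
                   n_cells = 100,          # cells per hierarchy level
                   depth = 1,              # hierarchy depth
                   quant.err = 0.1,        # quantization error threshold
                   projection.scale = 10,
                   normalize = TRUE)

# Step 3 (visualization): plot the hierarchical Voronoi tessellation
plotHVT(hvt.results, line.width = c(0.4), color.vec = c("#141B41"), maxDepth = 1)

# Step 4: score a new dataset against the trained map
predictions <- predictHVT(new_df, hvt.results, child.level = 1)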

2 Compress: Vector Quantization

Compression is a technique used to reduce the size of data while preserving its essential information, allowing for efficient storage and decompression to reconstruct the original data. Vector quantization (VQ) is a technique used in data compression to represent a set of data points with a smaller number of representative vectors. It achieves compression by exploiting redundancies or patterns in the data and replacing similar data points with representative vectors.

This package offers several advantages for performing data compression, as it is designed to handle high-dimensional data efficiently. It provides a hierarchical compression approach, allowing a multi-resolution representation of the data. The hierarchical structure enables efficient compression and storage of the data while preserving different levels of detail. HVT aims to preserve the topological structure of the data during compression. Spatial data with irregular shapes and complex structures in high dimensions can contain valuable information about relationships and patterns; HVT seeks to capture and retain these topological characteristics, enabling meaningful analysis and visualization. The package employs tessellation to divide the compressed data space into distinct cells or regions while preserving the topology of the original data. This means that the relationships and connectivity between data points are maintained in the compressed representation.

This package can perform vector quantization using the following algorithms:

2.1 Hierarchical Vector Quantization

2.1.1 Using k-means

  1. The k-means algorithm randomly selects k data points as initial means
  2. k clusters are formed by assigning each data point to its closest cluster mean using the Euclidean distance
  3. Virtual means for each cluster are calculated using all data points contained in that cluster

The second and third steps are iterated until a predefined number of iterations is reached or the clusters converge. For a fixed number of clusters and iterations, the runtime of the algorithm is O(n).
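
As a concrete illustration of these steps (not the package's internal code), base R's kmeans returns the cluster means, which form the codebook, and the cell membership of every point:

set.seed(240)
X <- matrix(rnorm(200 * 3), ncol = 3)   # 200 points in 3 dimensions

km <- kmeans(X, centers = 5, iter.max = 100)

km$centers    # the 5 cluster means (the codebook)
km$cluster    # cell membership for each data point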

2.1.2 Using k-medoids

  1. The k-medoids algorithm randomly selects k out of the n data points as the initial medoids.
  2. k clusters are formed by assigning each data point to its closest medoid using any common distance metric.
  3. A new medoid for each cluster is determined as the data point that minimizes the total distance to all other data points in that cluster

The second and third steps are iterated until a predefined number of iterations is reached or the clusters converge. The runtime for the algorithm is O(k * (n-k)^2).
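
For comparison, the k-medoids variant is available through pam (Partitioning Around Medoids) from the cluster package. The sketch below is illustrative only, not the package's internal implementation:

set.seed(240)
X <- matrix(rnorm(200 * 3), ncol = 3)

# k-medoids with the Manhattan (L1) distance
pm <- cluster::pam(X, k = 5, metric = "manhattan")

pm$medoids      # the 5 medoids (actual data points)
pm$clustering   # cell membership for each data point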

This algorithm divides the dataset recursively into cells using the k-means or k-medoids algorithm. The maximum number of subsets is decided by setting n_cells to, say, five, in order to divide the dataset into a maximum of five subsets. Each of these five subsets is further divided into five subsets (or fewer), resulting in a total of twenty-five (5 × 5) subsets. The recursion for a cell terminates when it contains fewer than three data points or when a stop criterion is reached. In this case, the stop criterion is reached when the cell error falls below the quantization threshold.

The steps for this method are as follows:

  1. Select k (the number of cells), the depth, and the quantization error threshold
  2. Perform quantization (using k-means or k-medoids) on the input dataset
  3. Calculate the quantization error for each of the k cells
  4. Compare the quantization error of each cell to the quantization error threshold
  5. Repeat steps 2 to 4 for each of the k cells whose quantization error is above the threshold, until a stop criterion is reached.

The recursion for a cell stops when any of the conditions below is satisfied:

  • the quantization error of the cell falls below the quantization error threshold
  • there are fewer than three data points in the cell
  • the user-specified depth has been attained

The quantization error for a cell is defined as follows:

\[QE = \max_i(||A-F_i||_{p})\]

where

  • \(A\) is the centroid of the cell
  • \(F_i\) represents a data point in the cell
  • \(m\) is the number of points in the cell
  • \(p\) is the \(p\)-norm metric. Here \(p\) = 1 represents L1 Norm and \(p\) = 2 represents L2 Norm.
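
Putting the recursion and the stop criteria together, the sketch below subdivides cells with k-means using the L1/max quantization error defined above (hvq_sketch is an illustrative name, not the package's internal hvq function):

# Recursive HVQ sketch: split a cell with k-means while its quantization
# error exceeds the threshold, it has at least 3 points, and depth remains
hvq_sketch <- function(X, n_cells = 5, quant_err = 0.1, depth = 2) {
  if (nrow(X) < 3 || depth < 1) return(list(points = X))
  k  <- min(n_cells, nrow(X) - 1)
  km <- kmeans(X, centers = k)
  lapply(seq_len(k), function(i) {
    cell <- X[km$cluster == i, , drop = FALSE]
    # QE = max_i ||A - F_i||_1, with A the centroid of the cell
    qe <- max(rowSums(abs(sweep(cell, 2, km$centers[i, ], "-"))))
    if (qe > quant_err) hvq_sketch(cell, n_cells, quant_err, depth - 1)
    else list(points = cell, qe = qe)
  })
}

set.seed(240)
cells <- hvq_sketch(matrix(rnorm(600), ncol = 3))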

2.1.3 Quantization Error

Let us try to understand quantization error with an example.

Figure 1: The Voronoi tessellation for level 1 shown for the 5 cells with the points overlaid

An example of a two-dimensional VQ is shown above.

In the above image, we can see 5 cells with each cell containing a certain number of points. The centroid for each cell is shown in blue. These centroids are also known as codewords since they represent all the points in that cell. The set of all codewords is called a codebook.

Now we want to calculate quantization error for each cell. For the sake of simplicity, let’s consider only one cell having centroid A and m data points \(F_i\) for calculating quantization error.

For each point, we calculate the distance between the point and the centroid.

\[ d = ||A - F_i||_{p} \]

In the above equation, p = 1 means L1_Norm distance whereas p = 2 means L2_Norm distance. In the package, the L1_Norm distance is chosen by default. The user can pass either L1_Norm, L2_Norm or a custom function to calculate the distance between two points in n dimensions.

\[QE = \max_i(||A-F_i||_{p})\]

Now, we take the maximum of the calculated distances over all m points. This gives us the furthest distance of a point in the cell from the centroid, which we refer to as the Quantization Error. If the Quantization Error is higher than the given threshold, the centroid/codevector is not a good representation for the points in the cell, and we can perform further vector quantization on these points and repeat the above steps.

Please note that the user can select mean, max, or any custom function to calculate the Quantization Error. The custom function takes a vector of m values (where each value is the distance between a point in n dimensions and the centroid) and returns a single value, which is the Quantization Error for the cell.

If we select mean as the error metric, the above Quantization Error equation will look like this:

\[QE = \frac{1}{m}\sum_{i=1}^m||A-F_i||_{p}\]
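
These definitions translate directly into a few lines of R. The sketch below (quant_error is an illustrative helper, not a package function) computes the p-norm distances from the centroid and reduces them with max or mean, matching the two equations above; any function from the m distances to a single value can be supplied as the reducer:

# Quantization error of one cell: reduce the p-norm distances between the
# centroid A and the m points F_i using max, mean, or a custom function
quant_error <- function(cell, centroid, p = 1, reduce = max) {
  d <- apply(cell, 1, function(f) sum(abs(f - centroid)^p)^(1 / p))
  reduce(d)
}

set.seed(240)
cell     <- matrix(rnorm(30), ncol = 3)   # m = 10 points in 3 dimensions
centroid <- colMeans(cell)

quant_error(cell, centroid, p = 1, reduce = max)    # max_i ||A - F_i||_1
quant_error(cell, centroid, p = 2, reduce = mean)   # (1/m) sum_i ||A - F_i||_2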

3 Projection

Projection mainly involves converting data from its original form to a different space or coordinate system while preserving certain of its properties. By projecting data into a common coordinate system, spatial relationships, distances, areas, and other spatial attributes can be accurately measured and compared.

HVT performs projection as part of its workflow to visualize and explore high-dimensional data. The projection step in HVT involves mapping the compressed data, represented by the hierarchical structure of cells, onto a lower-dimensional space for visualization purposes, as human perception is more suited to interpreting information in lower-dimensional spaces. Users can zoom in/out, rotate, and explore different regions of the projected space to gain insights and understand the data from different perspectives.

Sammon’s projection is an algorithm used in this package to map a high-dimensional space to a space of lower dimensionality while attempting to preserve the structure of inter-point distances in the projection. It is particularly suited for use in exploratory data analysis and is usually considered a non-linear approach since the mapping cannot be represented as a linear combination of the original variables. The centroids are plotted in 2D after performing Sammon’s projection at every level of the tessellation.

Denote the distance between the \(i^{th}\) and \(j^{th}\) objects in the original space by \(d_{ij}^*\), and the distance between their projections by \(d_{ij}\). Sammon's mapping aims to minimize the error function below, often referred to as Sammon's stress or Sammon's error:

\[E=\frac{1}{\sum_{i<j} d_{ij}^*}\sum_{i<j}\frac{(d_{ij}^*-d_{ij})^2}{d_{ij}^*}\]

The minimization of this can be performed either by gradient descent, as proposed initially, or by other means, usually involving iterative methods. The number of iterations needs to be experimentally determined and convergent solutions are not always guaranteed. Many implementations prefer to use the first principal components as a starting configuration.
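
As a standalone sketch (the package performs this step internally), MASS::sammon projects a small set of illustrative 3D centroids to 2D; by default it iterates from a classical MDS (principal coordinates) starting configuration:

set.seed(240)
centroids_3d <- matrix(rnorm(50 * 3), ncol = 3)   # 50 illustrative centroids

# Sammon's mapping from 3D to 2D over the inter-point distance matrix
proj <- MASS::sammon(dist(centroids_3d), k = 2, niter = 100)

head(proj$points)   # projected 2D coordinates
proj$stress         # final value of Sammon's stress E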

4 Voronoi Tessellations

A Voronoi diagram is a way of dividing space into a number of regions. A set of points (called seeds, sites, or generators) is specified beforehand, and for each seed there will be a corresponding region consisting of all points closer to that seed than to any other. These regions are called Voronoi cells. The Voronoi diagram is complementary to the Delaunay triangulation, a geometrical algorithm used to create a triangulated mesh from a set of points in a plane with the property that no data point lies within the circumcircle of any triangle in the triangulation. This property guarantees that the resulting cells in the tessellation do not overlap with each other.

By using Delaunay triangulation, HVT can achieve a partitioning of the data space into distinct and non-overlapping regions, which is crucial for accurately representing and analyzing the compressed data. Additionally, the use of Delaunay triangulation for tessellation ensures that the resulting cells have well-defined shapes, typically triangles in two dimensions or tetrahedra in three dimensions.

The hierarchical structure resulting from tessellation preserves the inherent structure and relationships within the data. It captures clusters, subclusters, and other patterns in the data, allowing for a more organized and interpretable representation. The hierarchical structure reduces redundancy and enables more compact representations.

Tessellate: Constructing Voronoi Tessellations

In this package, we use the sammon function from the package MASS to project higher dimensional data to a 2D space. The function hvq called from the HVT function returns hierarchical quantized data which will be the input for construction of the tessellations. The data is then represented in 2D coordinates and the tessellations are plotted using these coordinates as centroids. We use the package deldir for this purpose. The deldir package computes the Delaunay triangulation (and hence the Dirichlet or Voronoi tessellation) of a planar point set according to the second (iterative) algorithm of Lee and Schacter. For subsequent levels, a transformation is performed on the 2D coordinates to get all the points within their parent tile. Tessellations are plotted using these transformed points as centroids. The lines in the tessellations are chopped in places so that they do not protrude outside the parent polygon. This is done for all the subsequent levels.
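
A sketch of this projection-plus-tessellation step outside the package is shown below; plotHVT performs the hierarchical version, including the clipping described above:

set.seed(240)
centroids_3d <- matrix(rnorm(50 * 3), ncol = 3)   # illustrative 3D centroids

# Project the centroids to 2D, then tessellate the plane around them
proj <- MASS::sammon(dist(centroids_3d), k = 2)
tess <- deldir::deldir(proj$points[, 1], proj$points[, 2])

# Draw only the Voronoi (Dirichlet) edges, with the centroids overlaid
plot(tess, wlines = "tess")
points(proj$points, pch = 21, bg = "blue", cex = 0.6)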

5 Prediction

Prediction refers to the process of making predictions or estimating future values or outcomes based on existing data patterns. In data prediction, a model is developed based on historical data or a training dataset, and this model is then used to make predictions on new, unseen data. The model captures the underlying patterns, trends, and relationships present in the training data, allowing it to make informed predictions on similar or related data points.

In this package, we use the predictHVT function to predict the cell to which each point in the test dataset belongs.

Prediction Algorithm

The prediction algorithm recursively calculates the distance between each point in the test dataset and the cell centroids for each level. The following steps explain the prediction method for a single point in the test dataset:

  1. Calculate the distance between the point and the centroid of all the cells in the first level
  2. Find the cell whose centroid has minimum distance to the point
  3. Check if the cell drills down further to form more cells
  4. If it doesn’t, return the path. Or else repeat steps 1 to 4 till we reach a level at which the cell doesn’t drill down further

6 Example Usage: Visualizing Multidimensional Data with Sammon's Projection using Torus (Donut)

In this section, we will see how we can use the package to visualize multidimensional data by projecting it to two dimensions using Sammon's projection.

Data Understanding

First of all, let us see how to generate data for a torus. We are using the library geozoo for this purpose. Geo Zoo (short for Geometric Zoo) is a compilation of geometric objects ranging from three to ten dimensions. Geo Zoo contains regular or well-known objects, e.g. the cube and sphere, and some abstract objects, e.g. Boy's surface, the torus and the hyper-torus.

Here, we will generate a 3D torus (a torus is a surface of revolution generated by revolving a circle in three-dimensional space one full revolution about an axis that is coplanar with the circle) with 12000 points.

Let us first split the torus data into train and test. We will use 9000 data points for training and the remaining 3000 data points for testing.

set.seed(240)
library(dplyr)   # provides the %>% pipe used below

# Here p represents the dimension of the object
# and n the number of points
torus <- geozoo::torus(p = 3, n = 12000)
torus_df <- data.frame(torus$points)
colnames(torus_df) <- c("x", "y", "z")
torus_df1 <- torus_df %>% round(4)
noOfPoints <- dim(torus_df)[1]
trainTorus <- torus_df[1:9000, ]
trainTorus1 <- trainTorus %>% round(4)
testTorus <- torus_df[9001:noOfPoints, ]

Now let’s do some EDA on the data. First of all, we will see how the training data looks like. For the shake of brevity we are displaying first six rows.


Table(head(trainTorus))
x y z
-2.628238 0.5655770 -0.7253285
-1.417917 -0.8902793 0.9454533
-1.030820 1.1066495 -0.8730506
1.884711 0.1894905 0.9943888
-1.950608 -2.2506838 0.2070521
-1.482371 0.9228529 0.9672467

Now let’s have a look at structure and summary of the data.

str(trainTorus)
#> 'data.frame':    9000 obs. of  3 variables:
#>  $ x: num  -2.63 -1.42 -1.03 1.88 -1.95 ...
#>  $ y: num  0.566 -0.89 1.107 0.189 -2.251 ...
#>  $ z: num  -0.725 0.945 -0.873 0.994 0.207 ...
summary(trainTorus)
#>        x                  y                   z             
#>  Min.   :-2.99767   Min.   :-2.999343   Min.   :-0.9999999  
#>  1st Qu.:-1.15065   1st Qu.:-1.120632   1st Qu.:-0.7130951  
#>  Median :-0.01899   Median : 0.001856   Median : 0.0033675  
#>  Mean   :-0.00914   Mean   : 0.004195   Mean   : 0.0001237  
#>  3rd Qu.: 1.13001   3rd Qu.: 1.130708   3rd Qu.: 0.7138584  
#>  Max.   : 2.99713   Max.   : 2.999308   Max.   : 1.0000000

Now let’s try to visualize the torus (donut) in 3D Space.


plot_torus <- plotly::plot_ly(torus_df, x = ~x, y = ~y, z = ~z, color = ~z) %>%
  plotly::add_markers()
plot_torus

Figure 2: 3D Torus

Note: The steps of compression, projection, and tessellation are iteratively performed until a minimum compression rate of 80% is achieved. Once the desired compression is attained, the resulting model object is used for scoring with the predictHVT() function.

In this section, all the workflow steps outlined in the abstract (Compression, Projection, Tessellation, and Prediction) are executed at level 1.

6.0.1 Step 1: Data Compression

The core function for compression in the workflow is hvq, which is called within the HVT function. It takes a quantization error parameter that acts as a threshold and determines the number of levels in the hierarchy: if there are n levels in the hierarchy, then all the clusters formed up to that level will have a quantization error equal to or greater than the threshold quantization error. The user can define the number of clusters in the first level of the hierarchy; each cluster in the first level is then sub-divided into the same number of clusters as in the first level. This process continues, with each group divided into smaller clusters, as long as the threshold quantization error is exceeded. The output of this technique is hierarchically arranged vector quantized data.

However, let’s try to comprehend the HVT function first before moving on.

HVT(
  dataset,
  min_compression_perc,
  n_cells,
  depth,
  quant.err,
  projection.scale,
  normalize = T,
  distance_metric = c("L1_Norm", "L2_Norm"),
  error_metric = c("mean", "max"),
  quant_method = c("kmeans", "kmedoids"),
  diagnose = TRUE,
  hvt_validation = FALSE,
  train_validation_split_ratio = 0.8
)

Each of the parameters of the HVT function is explained below:

  • dataset: a data frame containing the input data
  • min_compression_perc: the minimum compression percentage to be achieved
  • n_cells: the number of cells per hierarchy level
  • depth: the depth of the hierarchy (1 builds a single level)
  • quant.err: the quantization error threshold for a cell
  • projection.scale: the scale factor for the projected coordinates
  • normalize: whether to standardize the data before compression
  • distance_metric: the distance metric (L1_Norm or L2_Norm) used between points and centroids
  • error_metric: the function (mean or max) used to compute the quantization error of a cell
  • quant_method: the quantization algorithm, kmeans or kmedoids
  • diagnose, hvt_validation, train_validation_split_ratio: optional diagnostics and validation controls

We will use the HVT function to compress our data while preserving essential features of the dataset. Our goal is to achieve data compression of at least 80%. In situations where the compression ratio does not meet the desired target, we can explore adjusting the model parameters as a potential solution. This involves modifying parameters such as the quantization error threshold or increasing the number of cells, and then rerunning the HVT function.

In our example, we will iteratively increase the number of cells until the desired compression percentage is reached, instead of raising the quantization threshold, because raising the threshold may reduce the level of detail captured in the data representation.
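
A sketch of that iteration is shown below (a hypothetical manual loop around HVT; the summary column name is taken from the compression summaries shown later in this section):

# Increase n_cells until at least 80% of the cells fall below the
# quantization error threshold (mirroring the manual runs below)
n_cells <- 100
repeat {
  hvt.fit <- muHVT::HVT(torus_df, n_cells = n_cells, depth = 1,
                        quant.err = 0.1, projection.scale = 10,
                        normalize = T, distance_metric = "L1_Norm",
                        error_metric = "max", quant_method = "kmeans")
  summ <- hvt.fit[[3]]$compression_summary
  if (as.numeric(summ$percentOfCellsBelowQuantizationErrorThreshold) >= 0.80) break
  n_cells <- n_cells * 3   # 100 -> 300 -> 900, as in the runs below
}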

We will pass the below-mentioned model parameters along with the torus dataset to the HVT function.

Model Parameters

set.seed(240)
hvt.torus <- muHVT::HVT(
  torus_df,
  n_cells = 100,
  depth = 1,
  quant.err = 0.1,
  projection.scale = 10,
  normalize = T,
  distance_metric = "L1_Norm",
  error_metric = "max",
  quant_method = "kmeans"
)

Let’s checkout the compression summary .

compressionSummaryTable(hvt.torus[[3]]$compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 100 0 0 n_cells: 100 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans

As can be seen from the table above, none of the 100 cells have reached the quantization error threshold. Therefore, we can further subdivide the cells by increasing the n_cells parameter and then see if the desired compression (80%) is reached.

6.0.2 Step 2: Data Projection

The function sammonsProjection() utilizes the sammon function from the MASS package and is called within HVT. Sammon's projection is an algorithm that maps a high-dimensional space to a space of lower dimensionality while attempting to preserve the structure of inter-point distances in the projection. The centroids are plotted in 2D after performing Sammon's projection at every level of the tessellation.

Let's view the projected 2D centroids after performing Sammon's projection. For the sake of brevity, we are displaying the first six rows.


hvt_torus_coordinates <- hvt.torus[[2]][[1]][["1"]]

# Extract the projected 2D coordinates ("pt") of each centroid
coordinates_value <- lapply(hvt_torus_coordinates, function(centroid) centroid$pt)
centroid_coordinates <- do.call(rbind.data.frame, coordinates_value)
colnames(centroid_coordinates) <- c("x", "y")
centroid_coordinates <- centroid_coordinates %>% round(4)
Table(head(centroid_coordinates), scroll = T, limit = 20)
x y
21.0149 -3.5616
-8.4949 13.0874
-1.7701 -5.6721
-2.4725 -16.9351
-3.8628 21.0911
-14.0413 -11.1851

6.0.3 Step 3: Tessellation

The deldir package computes the Delaunay triangulation (and hence the Dirichlet or Voronoi tessellation) of a planar point set according to the second (iterative) algorithm of Lee and Schacter. For subsequent levels, transformation is performed on the 2D coordinates to get all the points within its parent tile. Tessellations are plotted using these transformed points as centroids.

plotHVT is the main function to plot hierarchical Voronoi tessellations.

Now let’s try to understand plotHVT function. The parameters have been explained in detail below

plotHVT(hvt.results, line.width, color.vec, pch1 = 21, centroid.size = 3, title = NULL, maxDepth = 1)

For better visualisation, let’s plot the Voronoi tessellation.

muHVT::plotHVT(
  hvt.torus,
  line.width = c(0.4),
  color.vec = c("#141B41"),
  centroid.size = 0.6,
  maxDepth = 1
)
Figure 3: The Voronoi tessellation for layer 1 shown for the 100 cells in the dataset ’torus’

Since we are yet to achieve at least 80% compression, let's try to compress again using the below-mentioned set of model parameters.

Step 1: Data Compression

For more detailed information on Data Compression please refer to section 2 of this vignette.

Model Parameters

set.seed(240)
hvt.torus2 <- muHVT::HVT(
  torus_df,
  n_cells = 300,
  depth = 1,
  quant.err = 0.1,
  projection.scale = 10,
  normalize = T,
  distance_metric = "L1_Norm",
  error_metric = "max",
  quant_method = "kmeans"
)

Let’s checkout the compression summary again.

compressionSummaryTable(hvt.torus2[[3]]$compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 300 43 0.14 n_cells: 300 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans

It can be observed from the table above that only 43 cells out of 300, i.e. 14% of the cells, reached the quantization error threshold. Therefore, we can further subdivide the cells by increasing the n_cells parameter and then see if 80% compression is reached.

Step 2: Data Projection

For more detailed information on Data Projection please refer to section 3 of this vignette.

Now, let's view the projected 2D centroids after performing Sammon's projection. For the sake of brevity, we are displaying the first six rows.


hvt_torus_coordinates <- hvt.torus2[[2]][[1]][["1"]]

# Extract the projected 2D coordinates ("pt") of each centroid
coordinates_value <- lapply(hvt_torus_coordinates, function(centroid) centroid$pt)
centroid_coordinates <- do.call(rbind.data.frame, coordinates_value)
colnames(centroid_coordinates) <- c("x", "y")
centroid_coordinates <- centroid_coordinates %>% round(4)
Table(head(centroid_coordinates), scroll = T, limit = 20)
x y
22.0935 0.9400
-8.4087 11.4999
-0.0049 -5.9984
1.9846 -14.4026
-0.9439 22.0075
-6.5948 -13.0226

Step 3: Tessellation

For more detailed information on Tessellation please refer to section 4 of this vignette.

For better visualisation, let’s plot the Voronoi tessellation.

muHVT::plotHVT(
  hvt.torus2,
  line.width = c(0.4),
  color.vec = c("#141B41"),
  centroid.size = 0.6,
  maxDepth = 1
)
Figure 4: The Voronoi tessellation for layer 1 shown for the 300 cells in the dataset ’torus’

Now let’s try again with the below mentioned set of model parameters and see whether we achieve a compression rate of up to 80%:

Step 1: Data Compression

Model Parameters

set.seed(240)
hvt.torus3 <- muHVT::HVT(
  torus_df,
  n_cells = 900,
  depth = 1,
  quant.err = 0.1,
  projection.scale = 10,
  normalize = T,
  distance_metric = "L1_Norm",
  error_metric = "max",
  quant_method = "kmeans"
)

Let’s check the compression summary for torus.

compressionSummaryTable(hvt.torus3[[3]]$compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 900 839 0.93 n_cells: 900 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans

By increasing the number of cells to 900, we were able to successfully compress 93% of the data, so we will not subdivide the cells further.

Step 2: Data Projection

Now, let's view the projected 2D centroids after performing Sammon's projection. For the sake of brevity, we are displaying the first six rows.


hvt_torus_coordinates <- hvt.torus3[[2]][[1]][["1"]]

# Extract the projected 2D coordinates ("pt") of each centroid
coordinates_value <- lapply(hvt_torus_coordinates, function(centroid) centroid$pt)
centroid_coordinates <- do.call(rbind.data.frame, coordinates_value)
colnames(centroid_coordinates) <- c("x", "y")
centroid_coordinates <- centroid_coordinates %>% round(4)
Table(head(centroid_coordinates), scroll = T, limit = 20)
x y
20.7876 8.8385
-15.0650 6.5840
4.1539 -7.6615
9.7266 -12.0612
-10.5474 18.8626
-6.0931 -12.1636

Step 3: Tessellation

For better visualisation, let’s plot the Voronoi tessellation.

muHVT::plotHVT(
  hvt.torus3,
  line.width = c(0.4),
  color.vec = c("#141B41"),
  centroid.size = 0.6,
  maxDepth = 1
)
Figure 5: The Voronoi tessellation for layer 1 shown for the 900 cells in the dataset ’torus’

From the presented plot, the inherent structure of the donut can be easily observed in the two-dimensional space.

We will now overlay each of the features as a heatmap over the Voronoi tessellation plot for better visualization and identification of patterns, trends, and variations in the data.

Let’s have look at the hvtHmap function which we will use to overlay a variable as heatmap.

hvtHmap(hvt.results, dataset, child.level, hmap.cols, color.vec, line.width, palette.color = 6)

Now let’s plot the Voronoi Tessellation with the heatmap overlaid for all the features in the torus data for better visualization and interpretation of data patterns and distributions.

The heatmaps displayed below provide a visual representation of the spatial characteristics of the torus, allowing us to observe patterns and trends in the distribution of each of the features (x, y and z). The green shades highlight regions with higher coordinate values in each of the heatmaps, while the indigo shades indicate areas with the lowest coordinate values. By analyzing these heatmaps, we can gain insights into the variations and relationships between these features within the torus structure.

muHVT::hvtHmap(
  hvt.torus3,
  torus_df,
  child.level = 1,
  hmap.cols = "x",
  line.width = c(0.4),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.8,
  show.points = T,
  quant.error.hmap = 0.1,
  n_cells.hmap = 15
)
Figure 6: The Voronoi tessellation for layer 1 and number of cells 900 with the heat map overlaid for variable x in the ’torus’ dataset

muHVT::hvtHmap(
  hvt.torus3,
  torus_df,
  child.level = 1,
  hmap.cols = "y",
  line.width = c(0.4),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.8,
  show.points = T,
  quant.error.hmap = 0.1,
  n_cells.hmap = 15
)
Figure 7: The Voronoi tessellation for layer 1 and number of cells 900 with the heat map overlaid for variable y in the ’torus’ dataset

muHVT::hvtHmap(
  hvt.torus3,
  torus_df,
  child.level = 1,
  hmap.cols = "z",
  line.width = c(0.4),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.8,
  show.points = T,
  quant.error.hmap = 0.1,
  n_cells.hmap = 15
)
Figure 8: The Voronoi tessellation for layer 1 and number of cells 900 with the heat map overlaid for variable z in the ’torus’ dataset

6.0.4 Step 4: Prediction (predictHVT)

Let's have a look at our test dataset before we pass it to the predictHVT function for scoring.

Table(head(testTorus))
x y z
9001 1.9070862 0.9435865 -0.9918060
9002 -0.1071257 1.6316421 0.9310683
9003 -0.4991939 1.6573103 -0.9631007
9004 -2.8809895 0.2508846 -0.4522470
9005 -1.1597959 0.2173767 0.5723509
9006 -0.9391760 0.8623292 -0.6887646

However, let’s try to comprehend the predictHVT function first before moving on

predictHVT(data,
           hvt.results,
           hmap.cols = NULL,
           child.level = 1,
           ...)

The important parameters of the predictHVT function are as below:

  • data: the test dataset to be scored
  • hvt.results: the trained model object returned by HVT
  • hmap.cols: the column to overlay as a heatmap on the scored output
  • child.level: the hierarchy level at which the data is scored

Now that we have built the model, let us use our test dataset to predict which cell and level each point belongs to.

set.seed(240)
predictions_torus <- muHVT::predictHVT(
  testTorus,
  hvt.torus3,
  hmap.cols = "Quant.Error",
  child.level = 1,
  line.width = c(1.2),
  color.vec = c("#141B41"),
  quant.error.hmap = 0.1,
  n_cells.hmap = 9000
)

Let’s see which cell and level each point belongs to. For the sake of brevity, we will only show the first 10 rows


predictions_torus[["scoredPredictedData"]] %>% head(100) %>% 
  round(2) %>%
  as.data.frame() %>%
  Table(scroll = T, limit = 10)
Segment.Level Segment.Parent Segment.Child n Cell.ID Quant.Error x y z centroidRadius diff anomalyFlag
1 1 533 1 63 0.04 1.28 0.63 -1.39 0.05 0.01 0
1 1 435 1 764 0.04 -0.07 1.09 1.31 0.03 -0.01 0
1 1 838 1 209 0.04 -0.33 1.11 -1.35 0.04 0.00 0
1 1 563 1 378 0.07 -1.91 0.17 -0.64 0.08 0.01 0
1 1 99 1 634 0.01 -0.77 0.14 0.80 0.05 0.04 0
1 1 166 1 351 0.05 -0.62 0.58 -0.97 0.07 0.02 0
1 1 39 1 315 0.00 -0.12 -0.75 -0.71 0.05 0.05 0
1 1 864 1 660 0.06 -0.79 -0.28 0.95 0.08 0.02 0
1 1 518 1 372 0.04 -0.74 -0.05 -0.69 0.05 0.00 0
1 1 22 1 539 0.10 1.70 1.04 0.33 0.08 -0.01 0

7 Example Usage: Predictions using the predictHVT on Personal Computers Dataset.

Data Understanding

In this section, we will use the Prices of Personal Computers dataset. This dataset contains 6259 observations and 10 features. The dataset observes the price from 1993 to 1995 of 486 personal computers in the US. The variables are price, speed, ram, screen, cd, etc. The dataset is read directly from the muHVT GitHub repository in the code below.

In this example, we will compress this dataset by using hierarchical VQ via k-means and visualize the Voronoi tessellation plots using Sammon's projection. Later on, we will overlay all the variables as a heatmap to generate further insights.

Here, we load the data and store it in a variable called computers.

set.seed(240)
# Load data from csv files
computers <- read.csv("https://raw.githubusercontent.com/Mu-Sigma/muHVT/master/vignettes/sample_dataset/Computers.csv")

Let’s explore the Personal Computers Dataset. For the shake of brevity we are displaying first six rows.

# Quick peek

Table(head(computers), scroll = T, limit = 20)
X price speed hd ram screen cd multi premium ads trend
1 1499 25 80 4 14 no no yes 94 1
2 1795 33 85 2 14 no no yes 94 1
3 1595 25 170 4 15 no no yes 94 1
4 1849 25 170 8 14 no no no 94 1
5 3295 33 340 16 14 no no yes 94 1
6 3695 66 340 16 14 no no yes 94 1

Now, let us check the structure of the data and analyse its summary.

str(computers)
#> 'data.frame':    6259 obs. of  11 variables:
#>  $ X      : int  1 2 3 4 5 6 7 8 9 10 ...
#>  $ price  : int  1499 1795 1595 1849 3295 3695 1720 1995 2225 2575 ...
#>  $ speed  : int  25 33 25 25 33 66 25 50 50 50 ...
#>  $ hd     : int  80 85 170 170 340 340 170 85 210 210 ...
#>  $ ram    : int  4 2 4 8 16 16 4 2 8 4 ...
#>  $ screen : int  14 14 15 14 14 14 14 14 14 15 ...
#>  $ cd     : chr  "no" "no" "no" "no" ...
#>  $ multi  : chr  "no" "no" "no" "no" ...
#>  $ premium: chr  "yes" "yes" "yes" "no" ...
#>  $ ads    : int  94 94 94 94 94 94 94 94 94 94 ...
#>  $ trend  : int  1 1 1 1 1 1 1 1 1 1 ...
summary(computers)
#>        X            price          speed              hd        
#>  Min.   :   1   Min.   : 949   Min.   : 25.00   Min.   :  80.0  
#>  1st Qu.:1566   1st Qu.:1794   1st Qu.: 33.00   1st Qu.: 214.0  
#>  Median :3130   Median :2144   Median : 50.00   Median : 340.0  
#>  Mean   :3130   Mean   :2220   Mean   : 52.01   Mean   : 416.6  
#>  3rd Qu.:4694   3rd Qu.:2595   3rd Qu.: 66.00   3rd Qu.: 528.0  
#>  Max.   :6259   Max.   :5399   Max.   :100.00   Max.   :2100.0  
#>       ram             screen           cd               multi          
#>  Min.   : 2.000   Min.   :14.00   Length:6259        Length:6259       
#>  1st Qu.: 4.000   1st Qu.:14.00   Class :character   Class :character  
#>  Median : 8.000   Median :14.00   Mode  :character   Mode  :character  
#>  Mean   : 8.287   Mean   :14.61                                        
#>  3rd Qu.: 8.000   3rd Qu.:15.00                                        
#>  Max.   :32.000   Max.   :17.00                                        
#>    premium               ads            trend      
#>  Length:6259        Min.   : 39.0   Min.   : 1.00  
#>  Class :character   1st Qu.:162.5   1st Qu.:10.00  
#>  Mode  :character   Median :246.0   Median :16.00  
#>                     Mean   :221.3   Mean   :15.93  
#>                     3rd Qu.:275.0   3rd Qu.:21.50  
#>                     Max.   :339.0   Max.   :35.00

Let us first split the data into train and test. We will use 80% of the data as train and remaining as test.

noOfPoints <- dim(computers)[1]
trainLength <- as.integer(noOfPoints * 0.8)
trainComputers <- computers[1:trainLength,]
testComputers <- computers[(trainLength+1):noOfPoints,]

K-means is not suitable for factor variables, as the sample space for factor variables is discrete and a Euclidean distance function on such a space isn't really meaningful. Hence, we will drop the variables X, cd, multi, premium, and trend from our dataset.

Here we keep the original trainComputers and testComputers as we will use the variables from this dataset to overlay as heatmap and generate some insights.

trainComputers <-
  trainComputers %>% dplyr::select(-c(X, cd, multi, premium, trend))
testComputers <-
  testComputers %>% dplyr::select(-c(X, cd, multi, premium, trend))

Now, let's have a look at the scaled training dataset (5007 data points). For the sake of brevity, we are displaying the first six rows.

trainComputers <- scale(trainComputers) 

metric_list <- colnames(trainComputers)
scale_attr <- attributes(trainComputers)

trainComputers1 <- trainComputers %>% as.data.frame() %>% round(4)
Table(head(trainComputers1))
price speed hd ram screen ads
-1.2977 -1.1952 -1.3134 -0.7181 -0.6148 -2.3877
-0.7999 -0.7832 -1.2896 -1.1092 -0.6148 -2.3877
-1.1362 -1.1952 -0.8853 -0.7181 0.5490 -2.3877
-0.7091 -1.1952 -0.8853 0.0641 -0.6148 -2.3877
1.7228 -0.7832 -0.0766 1.6285 -0.6148 -2.3877
2.3956 0.9161 -0.0766 1.6285 -0.6148 -2.3877

Now, let's have a look at the scaled testing dataset (1252 data points). For the sake of brevity, we are displaying the first six rows.

testComputers <- scale(testComputers, center = scale_attr$`scaled:center`, scale = scale_attr$`scaled:scale`) 
testComputers1 <- testComputers %>% as.data.frame() %>% round(4)
Table(head(testComputers1))
price speed hd ram screen ads
5008 -1.2287 -0.7832 -0.6760 -0.7181 0.5490 -0.8403
5009 1.3848 0.0922 3.0631 3.1928 0.5490 -0.8403
5010 -0.8016 0.0922 -0.6760 -0.7181 -0.6148 -0.8403
5011 0.2311 2.6668 -0.4096 -0.7181 -0.6148 -0.8403
5012 0.3084 0.9161 1.7311 1.6285 0.5490 -0.8403
5013 -0.5072 0.9161 3.0631 0.0641 -0.6148 -0.8403

Now that we are familiar with the structure of the computers data, we will follow the steps below to get predictions on the Computers dataset.

7.0.1 Step 1: Data Compression

For more detailed information on Data Compression please refer to section 2 of this vignette.

Model Parameters

set.seed(240)
hvt.results <- list()
hvt.results <- muHVT::HVT(trainComputers,   
                          n_cells = 1001,
                          depth = 1,
                          quant.err = 0.1,
                          projection.scale = 10,
                          normalize = F,
                          distance_metric = "L1_Norm",
                          error_metric = "max",
                          quant_method = "kmeans",
                          diagnose = F)

Now let’s check the compression summary. The table below shows no of cells, no of cells having quantization error below threshold and percentage of cells having quantization error below threshold for each level.

compressionSummaryTable(hvt.results[[3]]$compression_summary)
segmentLevel noOfCells noOfCellsBelowQuantizationError percentOfCellsBelowQuantizationErrorThreshold parameters
1 1001 831 0.83 n_cells: 1001 quant.err: 0.1 distance_metric: L1_Norm error_metric: max quant_method: kmeans

As can be seen from the table above, 83% of the cells have reached the quantization error threshold. Since we were successfully able to compress 83% of the data, we will not subdivide the cells further.

hvt.results[[3]] gives us detailed information about the hierarchical vector quantized data.

hvt.results[[3]][['summary']] gives a tabular summary containing the number of points, the quantization error, and the codebook for each cell.

The datatable displayed below is the summary from hvt.results.

summaryTable(hvt.results[[3]]$summary)
Segment.Level Segment.Parent Segment.Child n Cell.ID Quant.Error price speed hd ram screen ads
1 1 1 3 364 0.05 -0.46 0.92 -0.95 -0.72 -0.61 0.79
1 1 2 4 381 0.1 -1.02 0.92 -0.64 -0.72 0.55 -0.68
1 1 3 4 396 0.07 -0.40 0.92 -0.61 -0.72 -0.61 -0.57
1 1 4 5 660 0.07 0.04 0.92 0.17 0.06 0.55 0.93
1 1 5 2 591 0.01 -0.20 0.92 -0.50 0.06 0.55 0.08
1 1 6 4 741 0.05 1.28 -0.78 -0.41 -0.72 2.88 -0.38
1 1 7 7 720 0.06 0.02 0.92 0.85 0.06 0.55 1.52
1 1 8 5 905 0.08 1.82 0.92 0.68 1.63 0.55 0.28
1 1 9 5 594 0.03 -0.24 0.92 0.31 0.06 -0.61 0.07
1 1 10 2 391 0.02 -0.99 0.92 -0.08 -0.72 -0.61 1.01
1 1 11 5 190 0.06 -1.75 -0.78 -0.08 -0.72 -0.61 -0.08
1 1 12 3 68 0.08 -1.41 -1.20 -0.96 -0.72 0.55 0.83
1 1 13 6 87 0.05 -1.20 -1.20 -0.68 -0.72 -0.61 1.52
1 1 14 8 312 0.11 -0.72 0.09 -0.62 -0.72 -0.61 -0.71
1 1 15 13 987 0.16 1.49 0.09 3.06 3.19 0.55 -0.92
1 1 16 4 589 0.03 0.02 0.09 -0.08 0.06 0.55 0.72
1 1 17 8 647 0.1 0.50 0.09 -0.45 0.06 0.55 -1.67
1 1 18 2 37 0.04 -2.00 -0.78 -1.21 -1.11 -0.61 0.82
1 1 19 7 443 0.06 0.11 0.92 -0.64 -0.72 -0.61 -0.47
1 1 20 6 723 0.02 0.24 -0.78 0.82 1.63 -0.61 -0.30
1 1 21 5 102 0.07 0.32 -0.78 -0.81 -0.72 -0.61 -2.32
1 1 22 7 482 0.17 -0.04 2.67 -0.50 -0.72 -0.61 1.01
1 1 23 2 674 0.06 0.63 0.09 0.19 0.06 0.55 -1.08
1 1 24 7 820 0.09 0.86 0.09 0.09 0.06 2.88 0.36
1 1 25 8 714 0.14 1.15 0.92 -0.19 0.06 0.55 0.92
1 1 26 5 136 0.04 -1.53 -0.78 -0.69 -0.72 -0.61 -0.58
1 1 27 3 366 0.01 -0.71 0.92 -0.68 -0.72 -0.61 0.47
1 1 28 3 920 0.06 1.97 0.92 -0.08 0.06 2.88 0.72
1 1 29 3 182 0.01 -1.30 -0.78 -0.69 -0.72 -0.61 0.80
1 1 30 5 154 0.04 -0.26 0.92 -0.89 -0.72 -0.61 -2.27
1 1 31 3 360 0.08 -1.26 0.09 -0.23 -0.72 0.55 -0.45
1 1 32 7 457 0.05 -0.41 -1.20 0.33 0.06 -0.61 0.39
1 1 33 3 60 0.09 -1.82 -0.78 -0.50 -1.11 0.55 -0.16
1 1 34 8 468 0.08 -0.81 -0.78 0.23 0.06 0.55 0.45
1 1 35 6 187 0.02 -0.90 -0.78 -1.12 -0.72 -0.61 0.46
1 1 36 5 9 0.08 -0.60 -1.20 -0.85 -0.72 2.88 0.36
1 1 37 4 205 0.03 -1.26 -0.78 -0.65 -0.72 0.55 0.52
1 1 38 4 363 0.03 -0.68 0.92 -0.68 -0.72 -0.61 0.97
1 1 39 6 310 0.04 -0.65 0.09 -0.78 -0.72 -0.61 0.78
1 1 40 6 240 0.08 -1.41 -0.78 -0.08 -0.72 0.55 -0.51
1 1 41 2 55 0.04 -1.14 0.09 -1.18 -1.11 0.55 1.27
1 1 42 7 827 0.05 0.76 2.67 0.33 0.06 -0.61 1.52
1 1 43 3 556 0.02 -0.45 -0.78 0.82 0.06 0.55 0.07
1 1 44 9 194 0.03 -1.30 -0.78 -0.65 -0.72 -0.61 0.50
1 1 45 3 275 0.05 -1.03 0.09 -0.82 -0.72 -0.61 0.05
1 1 46 4 782 0.04 0.58 0.09 0.82 1.63 -0.61 1.01
1 1 47 9 217 0.07 -0.83 0.92 -1.18 -1.11 -0.61 0.91
1 1 48 2 988 0.05 2.89 0.92 3.06 1.63 -0.61 -1.37
1 1 49 7 506 0.04 -0.09 -0.78 0.33 0.06 -0.61 0.83
1 1 50 4 760 0.02 1.17 -0.78 0.46 1.63 -0.61 0.08
1 1 51 4 123 0.08 -1.35 -1.20 -0.75 -0.72 0.55 0.18
1 1 52 4 883 0.22 2.15 0.92 0.80 -0.13 0.55 -1.23
1 1 53 2 285 0.01 -0.97 -1.20 -0.68 0.06 -0.61 0.16
1 1 54 9 865 0.09 1.14 0.92 0.36 1.63 0.55 0.81
1 1 55 8 315 0.09 -1.16 0.92 -0.68 -0.72 -0.61 -0.73
1 1 56 2 608 0.05 -0.11 0.09 0.82 0.06 -0.61 1.27
1 1 57 5 610 0.08 0.35 0.09 0.43 0.06 -0.61 -0.44
1 1 58 1 799 0 0.38 0.09 0.82 1.63 -0.61 1.52
1 1 59 4 768 0.05 1.18 0.92 0.83 0.06 0.55 0.30
1 1 60 2 588 0.12 0.80 -0.78 -0.29 0.06 0.55 0.70
1 1 61 4 894 0.1 1.63 0.09 -0.08 0.06 2.88 0.92
1 1 62 5 168 0.05 -0.82 -0.78 -0.62 -0.72 -0.61 -1.30
1 1 63 8 878 0.1 1.13 0.92 0.07 0.06 2.88 0.37
1 1 64 2 912 0.01 1.34 1.38 0.82 1.63 -0.61 1.52
1 1 65 7 448 0.11 0.12 0.09 -0.08 -0.72 -0.61 0.61
1 1 66 3 896 0.13 1.04 0.92 -0.36 0.06 2.88 -0.87
1 1 67 2 717 0.01 0.46 -1.20 0.46 1.63 -0.61 0.08
1 1 68 8 155 0.08 -1.27 -0.78 -0.65 -0.72 0.55 -0.59
1 1 69 6 803 0.09 0.79 -0.92 0.46 1.63 -0.61 -1.67
1 1 70 7 622 0.1 0.97 0.92 -0.58 -0.72 0.55 0.39
1 1 71 3 328 0.04 -1.27 0.09 -0.08 -0.72 -0.61 0.44
1 1 72 6 505 0.04 -0.17 -0.78 0.33 0.06 -0.61 1.52
1 1 73 3 477 0.03 -0.43 -0.78 -0.08 0.06 0.55 0.91
1 1 74 3 696 0.01 -0.04 -0.78 0.82 1.63 -0.61 0.47
1 1 75 6 998 0.26 1.67 -0.07 3.06 3.19 2.88 -0.99
1 1 76 7 636 0.05 0.00 0.92 -0.08 0.06 0.55 0.28
1 1 77 5 344 0.04 -0.63 0.92 -0.68 -0.72 -0.61 1.52
1 1 78 3 895 0.01 0.13 -0.78 1.73 1.63 0.55 -1.30
1 1 79 5 755 0.04 0.14 -1.20 0.82 1.63 -0.61 1.52
1 1 80 2 367 0.01 0.22 -0.78 -0.69 0.06 -0.61 -2.23
1 1 81 3 138 0.02 -0.61 -0.78 -1.18 -1.11 -0.61 -0.44
1 1 82 3 558 0.03 0.38 0.09 -0.08 0.06 -0.61 0.08
1 1 83 7 642 0.07 -0.47 0.92 0.32 0.06 0.55 -0.46
1 1 84 5 667 0.04 -0.34 0.92 0.30 0.06 0.55 -1.30
1 1 85 3 983 0.07 1.52 0.92 3.06 3.19 0.55 0.08
1 1 86 4 885 0.07 2.73 0.92 0.81 0.06 0.55 0.10
1 1 87 6 162 0.04 -1.28 -0.78 -0.97 -0.72 -0.61 0.07
1 1 88 5 519 0.12 -0.88 0.92 0.16 -0.72 0.55 -0.56
1 1 89 8 848 0.19 -0.02 0.61 3.06 0.06 -0.61 -0.90
1 1 90 3 569 0.06 0.05 0.92 -0.23 0.06 -0.61 0.80
1 1 91 8 274 0.04 -0.48 -0.78 -0.65 -0.72 -0.61 0.90
1 1 92 2 511 0.06 -0.96 0.09 -0.08 0.06 0.55 0.08
1 1 93 1 713 0 -0.13 -1.20 0.82 1.63 -0.61 1.01
1 1 94 5 644 0.04 0.36 0.92 0.33 0.06 -0.61 0.81
1 1 95 4 626 0.09 -0.83 0.92 0.23 0.06 0.55 -0.59
1 1 96 2 5 0.09 -1.04 -0.78 -0.60 -0.91 2.88 -0.73
1 1 97 6 785 0.04 1.23 -0.78 0.82 1.63 -0.61 0.56
1 1 98 4 38 0.05 -1.71 -1.20 -1.18 -1.11 -0.61 0.83
1 1 99 3 614 0.04 0.33 0.09 -0.11 0.06 0.55 0.12
1 1 100 4 571 0.05 0.24 -0.78 0.86 0.06 -0.61 0.20
1 1 101 6 847 0.04 1.13 0.92 0.82 1.63 -0.61 0.37
1 1 102 5 211 0.01 -1.16 -0.78 -0.69 -0.72 -0.61 0.07
1 1 103 4 195 0.04 -0.42 -1.20 -1.12 -0.72 -0.61 0.56
1 1 104 5 227 0.02 -0.79 -0.78 -0.89 -0.72 -0.61 0.07
1 1 105 5 567 0.08 0.14 -0.78 0.82 0.06 -0.61 -0.55
1 1 106 9 765 0.11 0.17 2.67 0.26 0.06 -0.61 1.52
1 1 107 3 750 0.03 0.57 -0.78 0.84 1.63 -0.61 0.88
1 1 108 3 348 0.04 -0.94 0.09 -0.08 -0.72 -0.61 -0.56
1 1 109 5 612 0.04 0.56 0.09 0.33 0.06 -0.61 0.58
1 1 110 4 805 0.01 0.35 2.67 0.31 0.06 0.55 -0.30
1 1 111 4 797 0.1 0.20 -0.99 0.82 1.63 0.55 1.27
1 1 112 10 232 0.02 -0.94 -0.78 -0.68 -0.72 -0.61 0.43
1 1 113 3 921 0.01 0.97 2.67 0.82 1.63 -0.61 0.47
1 1 114 8 899 0.11 0.97 0.97 0.69 1.63 0.55 1.52
1 1 115 2 473 0.07 -0.92 0.92 -0.38 0.06 -0.61 -1.30
1 1 116 3 304 0.09 -0.84 0.09 -0.75 -0.72 0.55 -1.07
1 1 117 7 904 0.06 1.57 0.92 0.84 1.63 0.55 0.76
1 1 118 4 637 0.06 0.31 -0.78 0.89 0.06 0.55 0.45
1 1 119 3 116 0.04 -1.71 -1.20 -0.68 -0.72 -0.61 0.19
1 1 120 5 602 0.05 -0.53 0.92 -0.08 0.06 0.55 0.46
1 1 121 7 354 0.11 -0.09 0.09 -0.70 -0.72 -0.61 -1.08
1 1 122 7 456 0.04 -0.38 -1.20 0.33 0.06 -0.61 0.79
1 1 123 2 889 0.05 1.21 0.09 0.46 1.63 0.55 -1.37
1 1 124 3 963 0.01 1.17 -0.78 3.06 3.19 -0.61 0.47
1 1 125 4 494 0.06 -0.24 -1.20 0.45 0.06 -0.61 -0.44
1 1 126 3 84 0.01 -0.46 -0.78 -0.50 -0.72 -0.61 -2.35
1 1 127 4 65 0.06 -0.89 0.09 -1.18 -1.11 -0.61 -1.37
1 1 128 8 169 0.05 -0.74 -1.20 -1.09 -0.72 -0.61 0.55
1 1 129 4 438 0.07 -0.64 -1.20 0.45 0.06 -0.61 1.01
1 1 130 11 140 0.04 -1.11 -0.78 -0.67 -0.72 -0.61 1.52
1 1 131 8 743 0.04 0.62 -0.78 0.82 1.63 -0.61 0.45
1 1 132 4 855 0.11 1.17 0.09 0.73 1.63 0.55 -0.02
1 1 133 4 326 0.03 -0.89 -0.78 0.33 -0.72 -0.61 -0.33
1 1 134 7 322 0.05 -0.03 -0.78 -0.60 -0.72 -0.61 -0.44
1 1 135 1 133 0 -0.82 -1.20 -1.12 -0.72 -0.61 -0.44
1 1 136 7 418 0.1 0.10 0.92 -0.75 -0.72 -0.61 -1.25
1 1 137 5 655 0.04 -0.11 0.92 0.32 0.06 0.55 0.43
1 1 138 7 427 0.05 -0.41 -0.78 -0.08 0.06 -0.61 0.93
1 1 139 2 813 0.01 0.58 0.09 0.82 1.63 -0.61 1.52
1 1 140 4 176 0.02 -0.49 -0.78 -1.18 -1.11 -0.61 0.07
1 1 141 6 13 0.07 -0.97 -0.99 -0.89 -0.72 0.55 -2.35
1 1 142 3 752 0.02 0.60 -1.20 0.82 1.63 -0.61 0.58
1 1 143 5 570 0.1 0.07 0.92 -0.39 0.06 -0.61 -1.08
1 1 144 1 810 0 1.55 -0.78 0.82 1.63 -0.61 0.24
1 1 145 6 673 0.07 0.30 0.92 0.82 0.06 -0.61 -0.59
1 1 146 8 767 0.03 0.91 -0.78 0.82 1.63 -0.61 0.58
1 1 147 2 114 0.02 -0.81 -1.20 -1.18 -1.11 -0.61 0.37
1 1 148 8 196 0.03 -0.92 -1.20 -0.68 -0.72 -0.61 0.29
1 1 149 4 340 0.09 -1.30 -0.99 -0.08 0.06 -0.61 -0.11
1 1 150 7 759 0.08 0.63 -0.96 0.82 1.63 -0.61 -0.42
1 1 151 7 631 0.08 0.03 0.92 0.40 0.06 -0.61 1.01
1 1 152 5 479 0.05 -0.17 0.92 -0.08 -0.72 -0.61 0.07
1 1 153 2 449 0.04 -1.30 0.09 0.90 -0.72 0.55 -1.07
1 1 154 5 265 0.06 -1.09 -0.78 -0.08 -0.72 -0.61 -0.49
1 1 155 1 990 0 2.90 0.92 0.32 4.76 0.55 0.50
1 1 156 5 319 0.06 -1.03 -0.78 -0.08 -0.72 0.55 0.23
1 1 157 6 222 0.1 -0.96 0.09 -0.71 -0.72 -0.61 -1.27
1 1 158 7 706 0.08 0.20 0.92 0.86 0.06 0.55 -0.19
1 1 159 2 458 0.04 -0.63 0.09 0.30 -0.72 0.55 1.27
1 1 160 3 268 0.02 -0.83 -0.78 -0.08 -0.72 -0.61 1.52
1 1 161 9 331 0.08 -0.26 -0.78 -0.63 -0.72 0.55 0.80
1 1 162 4 440 0.06 0.06 0.09 -0.08 -0.72 -0.61 1.27
1 1 163 5 390 0.09 0.43 -0.87 -0.59 0.06 -0.61 -2.38
1 1 164 5 801 0.04 1.28 0.09 0.46 1.63 -0.61 0.08
1 1 165 7 977 0.27 1.37 2.67 0.68 1.63 2.88 0.55
1 1 166 3 225 0.08 -0.63 0.92 -1.15 -0.98 -0.61 -1.08
1 1 167 4 970 0.01 1.30 -0.78 3.06 3.19 -0.61 -0.30
1 1 168 3 902 0.15 1.72 -0.49 -0.08 1.63 -0.61 -2.39
1 1 169 4 103 0.03 -1.60 -1.20 -0.89 -0.72 -0.61 0.60
1 1 170 2 431 0.04 -0.97 0.09 -0.08 0.06 -0.61 1.27
1 1 171 7 19 0.19 -1.24 -0.96 -1.20 -1.05 0.55 -1.33
1 1 172 6 150 0.09 -0.38 -0.92 -0.68 -0.72 -0.61 -1.67
1 1 173 7 173 0.03 -0.68 -1.20 -1.12 -0.72 -0.61 0.16
1 1 174 7 422 0.11 -0.07 0.09 -0.67 -0.72 0.55 0.43
1 1 175 3 806 0.05 0.60 2.67 0.86 0.06 -0.61 0.47
1 1 176 9 826 0.14 0.58 0.92 -0.12 0.06 2.88 0.30
1 1 177 5 95 0.07 -0.98 -0.78 -1.18 -1.11 0.55 0.62
1 1 178 2 726 0.01 0.08 -1.20 0.82 1.63 -0.61 1.01
1 1 179 5 455 0.03 -0.48 -0.78 0.32 0.06 -0.61 1.52
1 1 180 6 224 0.09 -1.41 0.09 -0.65 -0.72 -0.61 -0.56
1 1 181 7 886 0.05 0.50 -0.78 1.73 1.63 0.55 -0.68
1 1 182 3 583 0.06 0.49 0.09 -0.11 0.06 -0.61 -1.08
1 1 183 3 897 0.01 0.33 -0.78 1.73 1.63 0.55 -1.30
1 1 184 5 851 0.1 1.04 0.09 0.83 1.63 0.55 0.63
1 1 185 5 26 0.09 -1.34 -1.20 -1.01 -0.87 -0.61 -1.67
1 1 186 6 2 0.1 -1.48 -1.13 -1.29 -1.11 -0.61 -2.29
1 1 187 3 462 0.03 0.37 0.92 -0.68 -0.72 -0.61 0.54
1 1 188 8 59 0.05 -1.52 -1.20 -1.18 -1.11 -0.61 0.40
1 1 189 6 428 0.11 0.86 -0.78 -0.24 -0.72 -0.61 0.66
1 1 190 1 719 0 -0.14 2.67 0.33 -0.72 0.55 -0.30
1 1 191 8 424 0.07 -0.64 0.92 -0.60 -0.72 0.55 0.69
1 1 192 6 352 0.04 -0.33 0.09 -0.68 -0.72 -0.61 0.41
1 1 193 2 828 0.01 0.84 0.09 0.82 1.63 -0.61 1.52
1 1 194 4 611 0.09 0.33 0.09 -0.18 0.06 0.55 0.75
1 1 195 6 537 0.09 0.15 0.92 -0.55 -0.72 0.55 0.18
1 1 196 7 833 0.16 0.02 0.92 0.30 0.06 2.88 -0.55
1 1 197 6 563 0.14 0.47 0.09 -0.18 0.06 -0.61 0.87
1 1 198 6 650 0.09 0.88 0.92 -0.22 0.06 -0.61 0.88
1 1 199 8 942 0.19 0.81 1.03 0.59 1.63 2.88 0.08
1 1 200 5 164 0.05 -1.34 -0.78 -0.08 -0.72 -0.61 1.52
1 1 201 4 972 0.05 1.17 -0.78 3.06 3.19 0.55 0.27
1 1 202 9 216 0.03 -1.12 -0.78 -0.67 -0.72 -0.61 0.47
1 1 203 8 351 0.09 -0.79 0.09 -0.61 -0.72 0.55 0.74
1 1 204 5 686 0.04 0.43 0.92 0.32 0.06 0.55 0.17
1 1 205 4 284 0.04 -0.92 -0.78 -0.08 -0.72 -0.61 1.01
1 1 206 5 817 0.04 1.18 0.09 0.82 1.63 -0.61 0.12
1 1 207 6 547 0.04 0.39 -0.78 0.33 0.06 -0.61 0.56
1 1 208 5 298 0.03 -1.17 -0.78 0.33 -0.72 -0.61 -0.33
1 1 209 5 818 0.04 0.56 0.92 0.82 1.63 -0.61 -0.30
1 1 210 6 27 0.04 -0.58 -1.20 -1.08 -0.72 -0.61 -2.29
1 1 211 10 214 0.06 -0.67 -1.20 -0.65 -0.72 -0.61 -0.44
1 1 212 6 486 0.06 -0.37 -0.78 -0.08 0.06 0.55 0.58
1 1 213 7 188 0.05 -1.10 -0.78 -0.68 -0.72 -0.61 -0.53
1 1 214 3 516 0.03 0.37 -0.78 -0.08 0.06 -0.61 -1.08
1 1 215 6 888 0.17 0.41 0.92 1.02 0.06 2.88 0.14
1 1 216 4 712 0.02 0.17 -0.78 0.82 1.63 -0.61 0.44
1 1 217 7 613 0.07 0.57 0.09 0.33 0.06 -0.61 0.13
1 1 218 4 179 0.11 -0.49 -0.89 -0.79 -0.72 0.55 1.52
1 1 219 12 598 0.05 -0.23 0.92 0.32 0.06 -0.61 -0.38
1 1 220 6 518 0.03 0.04 -0.78 0.33 0.06 -0.61 0.04
1 1 221 7 296 0.06 -0.90 -0.78 -0.08 -0.72 -0.61 0.39
1 1 222 10 145 0.04 -1.32 -0.78 -0.69 -0.72 -0.61 -0.77
1 1 223 7 681 0.04 0.30 0.92 0.34 0.06 0.55 0.40
1 1 224 8 172 0.07 -1.62 -0.78 -0.08 -0.72 -0.61 -0.67
1 1 225 5 98 0.03 -1.44 -1.20 -1.15 -0.72 -0.61 0.30
1 1 226 4 200 0.03 -0.75 -0.78 -1.12 -0.72 -0.61 0.66
1 1 227 6 99 0.03 -1.26 -1.20 -1.14 -0.72 -0.61 0.85
1 1 228 2 682 0.01 0.92 0.92 -0.08 0.06 -0.61 1.52
1 1 229 1 815 0 0.47 2.67 0.30 0.06 0.55 -0.30
1 1 230 1 679 0 -0.36 0.92 1.78 0.06 -0.61 0.07
1 1 231 5 389 0.07 -0.20 -0.78 -0.65 0.06 -0.61 -0.01
1 1 232 4 695 0.16 -0.17 -0.78 0.24 0.06 2.88 -0.45
1 1 233 3 121 0 -1.65 -1.20 -0.68 -0.72 -0.61 0.50
1 1 234 1 325 0 0.21 -0.78 -0.89 0.06 -0.61 -2.29
1 1 235 9 257 0.05 0.10 0.92 -1.04 -0.72 -0.61 -2.28
1 1 236 5 949 0.32 1.24 -0.78 4.82 -0.09 -0.38 0.49
1 1 237 4 846 0.08 1.01 0.92 -0.07 1.63 0.55 -0.32
1 1 238 8 85 0.05 -1.55 -1.20 -1.15 -0.72 -0.61 0.52
1 1 239 6 577 0.02 -0.52 0.92 0.32 0.06 -0.61 -0.30
1 1 240 12 57 0.04 -0.55 -0.78 -0.89 -0.72 -0.61 -2.30
1 1 241 5 41 0.15 -0.63 -0.95 -1.11 0.06 -0.61 -2.37
1 1 242 7 689 0.12 0.63 0.92 -0.32 0.06 0.55 -1.08
1 1 243 3 676 0.04 0.82 0.92 0.33 0.06 -0.61 -0.44
1 1 244 1 403 0 -1.81 -0.78 0.82 0.06 -0.61 0.07
1 1 245 9 857 0.11 2.39 0.92 0.76 0.06 0.55 0.45
1 1 246 3 463 0.07 -0.99 0.92 -0.08 -0.72 0.55 0.17
1 1 247 3 930 0.05 1.16 2.67 0.82 1.63 -0.61 -0.40
1 1 248 2 849 0.02 0.24 -0.78 1.73 1.63 0.55 0.07
1 1 249 5 115 0.04 -1.54 -1.20 -0.85 -0.72 -0.61 0.07
1 1 250 1 538 0 0.07 -0.78 0.90 -0.72 0.55 0.63
1 1 251 3 819 0.07 1.11 0.09 -0.08 -0.72 2.88 1.10
1 1 252 3 107 0.01 -1.35 -1.20 -1.12 -0.72 -0.61 0.05
1 1 253 4 192 0.06 -0.63 0.09 -1.18 -1.11 -0.61 -0.44
1 1 254 1 24 0 -0.13 0.92 -1.29 -1.11 -0.61 -2.39
1 1 255 5 309 0.04 -0.70 0.09 -0.68 -0.72 -0.61 1.01
1 1 256 3 724 0.12 0.65 0.92 0.87 -0.20 0.55 0.68
1 1 257 8 446 0.1 -0.59 -0.78 0.28 0.06 -0.61 -0.32
1 1 258 2 747 0.06 1.22 0.92 0.85 0.06 -0.61 -0.85
1 1 259 6 160 0.04 -1.62 -0.78 -0.69 -0.72 -0.61 0.39
1 1 260 7 413 0.05 -0.12 0.92 -0.63 -0.72 -0.61 0.79
1 1 261 5 177 0.12 -0.44 0.92 -1.05 -0.87 -0.61 -1.67
1 1 262 5 295 0.06 -1.04 0.92 -1.00 -0.72 -0.61 0.77
1 1 263 1 355 0 -0.63 -0.78 0.33 -0.72 -0.61 -0.30
1 1 264 5 802 0.11 1.78 0.92 0.62 0.06 0.55 0.56
1 1 265 3 777 0.04 0.96 -1.20 0.82 1.63 -0.61 0.33
1 1 266 4 276 0.16 -0.17 -0.89 -0.61 -0.72 0.55 -1.58
1 1 267 3 959 0.01 0.97 -0.78 3.06 3.19 -0.61 0.47
1 1 268 2 955 0.09 1.01 2.67 1.75 1.63 0.55 -0.39
1 1 269 3 137 0.05 -1.75 -0.78 -0.70 -0.72 -0.61 -0.05
1 1 270 6 210 0.03 -0.71 -0.78 -1.12 -0.72 -0.61 0.27
1 1 271 3 342 0.08 -0.87 -0.78 0.05 -0.72 0.55 0.63
1 1 272 4 124 0.03 -0.81 -0.78 -1.18 -1.11 0.55 0.11
1 1 273 5 410 0.08 -0.50 0.09 -0.69 0.06 -0.61 0.58
1 1 274 4 223 0.03 -0.82 -0.78 -0.89 -0.72 -0.61 0.52
1 1 275 3 165 0.01 -0.64 -0.78 -1.18 -1.11 -0.61 0.24
1 1 276 4 387 0.04 -0.85 0.09 0.31 -0.72 -0.61 0.83
1 1 277 11 253 0.04 -0.31 -0.78 -1.12 -0.72 -0.61 0.11
1 1 278 4 405 0.15 0.69 -0.78 -0.31 -0.72 -0.61 -0.60
1 1 279 2 167 0.01 -1.32 -0.78 -0.89 -0.72 -0.61 0.56
1 1 280 7 738 0.12 1.05 0.92 0.46 0.06 0.55 0.76
1 1 281 7 229 0.04 -0.65 -1.20 -0.68 -0.72 -0.61 0.57
1 1 282 4 733 0.02 0.38 -1.20 0.82 1.63 -0.61 0.37
1 1 283 6 461 0.06 -0.76 0.92 0.32 -0.72 -0.61 -0.26
1 1 284 1 925 0 2.23 0.92 0.68 1.63 0.55 0.87
1 1 285 4 690 0.04 1.17 0.92 0.33 0.06 -0.61 0.37
1 1 286 5 498 0.08 0.34 -0.78 -0.16 0.06 -0.61 -1.67
1 1 287 6 737 0.15 1.18 0.92 0.34 0.06 0.55 -0.03
1 1 288 7 578 0.04 0.17 0.09 0.33 0.06 -0.61 0.82
1 1 289 7 62 0.04 -1.25 -1.20 -1.18 -1.11 -0.61 0.86
1 1 290 6 835 0.03 1.34 0.09 0.86 1.63 -0.61 0.83
1 1 291 7 728 0.17 0.12 0.09 -0.16 0.06 2.88 0.64
1 1 292 8 416 0.06 -0.08 0.92 -0.70 -0.72 -0.61 0.38
1 1 293 3 39 0.05 -1.43 -1.20 -1.25 -0.72 -0.61 -1.08
... (rows for cells 294 through 1001 omitted for brevity; the full summary table, shown as a scrollable table in the rendered vignette, contains one row for each of the 1001 cells) ...

Now let us understand what each column in the above summary table means:

  1. Segment.Level - Level of the cell in the hierarchy; here it is 1 since we performed a single level of vector quantization

  2. Segment.Parent - Parent segment of the cell

  3. Segment.Child - Serial number of the cell under its parent segment

  4. n - Number of data points assigned to the cell

  5. Cell.ID - Identifier of the cell

  6. Quant.Error - Quantization error of the cell

All the columns after these contain the centroid of each cell. Together they can also be called a codebook, which represents the collection of all centroids or codewords.
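
For illustration, the codebook can be pulled out of the summary table by selecting the feature columns. A minimal sketch, assuming the summary shown above is held in a hypothetical data frame named summary_df:

# summary_df is a hypothetical data frame holding the summary table above.
# The six feature columns form the codebook: one codeword (centroid) per cell.
codebook <- summary_df[, c("price", "speed", "hd", "ram", "screen", "ads")]
dim(codebook)  # 1001 codewords in 6 dimensions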

7.0.2 Step 2: Data Projection

For more detailed information on Data Projection please refer to section 3 of this vignette.
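
As a quick refresher, Sammon's projection chooses the low-dimensional coordinates so as to minimize the stress

E = \frac{1}{\sum_{i<j} d^{*}_{ij}} \sum_{i<j} \frac{\left(d^{*}_{ij} - d_{ij}\right)^{2}}{d^{*}_{ij}}

where d^{*}_{ij} is the distance between centroids i and j in the original feature space and d_{ij} is the distance between their projected coordinates.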

Let's view the projected 2D centroids obtained by performing Sammon's projection on the compressed data received from vector quantization. For the sake of brevity, we display the first six rows.


# Extract the 2D Sammon coordinates of every level-one cell.
# The %>% pipe requires dplyr (or magrittr) to be loaded.
hvt_computers_coordinates <- hvt.results[[2]][[1]][["1"]]
coordinates_value <- lapply(seq_along(hvt_computers_coordinates), function(x) {
  hvt_computers_coordinates[[x]]$pt
})
centroid_coordinates <- do.call(rbind.data.frame, coordinates_value)
colnames(centroid_coordinates) <- c("x", "y")
centroid_coordinates <- centroid_coordinates %>% data.frame() %>% round(4)
Table(head(centroid_coordinates))
x y
10.1594 -3.0833
9.0798 -11.7843
7.6534 -3.3278
-5.6131 -6.2331
-0.5750 -8.9096
-4.6926 -24.8100

7.0.3 Step 3: Tessellation

For more detailed information on Tessellation, please refer to section 4 of this vignette.

We have now obtained the centroid coordinates resulting from Sammon's projection.
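
Each projected centroid c_i acts as a generator point for the tessellation: its Voronoi cell is the set of points at least as close to c_i as to any other centroid,

V_i = \{\, x : \lVert x - c_i \rVert \le \lVert x - c_j \rVert \ \text{for all } j \ne i \,\}

and this partition is exactly what plotHVT draws below.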

For better visualization, let's plot the Voronoi tessellation for Map A using the plotHVT function.

# Voronoi tessellation plot for level one
muHVT::plotHVT(hvt.results,
               line.width = c(0.2),
               color.vec = c("#141B41"),
               centroid.size = 0.01,
               maxDepth = 1)
Figure 9: The Voronoi Tessellation for layer 1 shown for the 1001 cells in the dataset ’computers’

Now let’s plot the Voronoi tessellation with a heatmap overlaid for each of the features in the computers dataset for better visualization.

The heatmaps displayed below provide a visual representation of the spatial characteristics of the computers data, allowing us to observe patterns and trends in the distribution of each feature (price, speed, hd, ram, screen, ads). Green shades highlight regions with higher values in each heatmap, while indigo shades indicate areas with the lowest values. By analyzing these heatmaps, we can gain insight into the variation of, and relationships between, these features within the computers data.
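
Since the calls below differ only in the hmap.cols argument, the same plots can also be generated with a short loop. This is just a sketch reusing the arguments of the calls that follow; note that plots drawn inside a loop may need an explicit print() to render:

# Sketch: one heatmap per feature, with the same settings as the
# individual calls shown below.
for (feature in c("price", "hd", "ram", "screen", "ads")) {
  print(muHVT::hvtHmap(
    hvt.results,
    trainComputers,
    child.level = 1,
    hmap.cols = feature,
    line.width = c(0.2),
    color.vec = c("#141B41"),
    palette.color = 6,
    centroid.size = 0.01,
    show.points = T,
    quant.error.hmap = 0.1,
    n_cells.hmap = 15
  ))
}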


muHVT::hvtHmap(
  hvt.results,
  trainComputers,
  child.level = 1,
  hmap.cols = "price",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.01,
  show.points = T,
  quant.error.hmap = 0.1,
  n_cells.hmap = 15
)
Figure 10: The Voronoi Tessellation with the heat map overlaid over the variable price in the ’computers’ dataset


muHVT::hvtHmap(
  hvt.results,
  trainComputers,
  child.level = 1,
  hmap.cols = "hd",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.01,
  show.points = T,
  quant.error.hmap = 0.1,
  n_cells.hmap = 15
)
Figure 11: The Voronoi Tessellation with the heat map overlaid over the variable hd in the ’computers’ dataset

muHVT::hvtHmap(
  hvt.results,
  trainComputers,
  child.level = 1,
  hmap.cols = "ram",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.01,
  show.points = T,
  quant.error.hmap = 0.1,
  n_cells.hmap = 15
)
Figure 12: The Voronoi Tessellation with the heat map overlaid over the variable ram in the ’computers’ dataset

muHVT::hvtHmap(
  hvt.results,
  trainComputers,
  child.level = 1,
  hmap.cols = "screen",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.01,
  show.points = T,
  quant.error.hmap = 0.1,
  n_cells.hmap = 15
)
Figure 13: The Voronoi Tessellation with the heat map overlaid over the variable screen in the ’computers’ dataset


muHVT::hvtHmap(
  hvt.results,
  trainComputers,
  child.level = 1,
  hmap.cols = "ads",
  line.width = c(0.2),
  color.vec = c("#141B41"),
  palette.color = 6,
  centroid.size = 0.01,
  show.points = T,
  quant.error.hmap = 0.1,
  n_cells.hmap = 15
)
Figure 14: The Voronoi Tessellation with the heat map overlaid over the variable ads in the ’computers’ dataset

7.0.4 Step 4: Prediction (predictHVT)

For more detailed information on Prediction, please refer to section 5 of this vignette.

Now that we have built the model, let us use our test dataset to predict which cell and which level each point belongs to.

predictHVT(data,
           hvt.results,
           hmap.cols = NULL,
           child.level = 1,
           ...)

The important parameters for the function predictHVT are as below:

  1. data - The test dataset to be scored

  2. hvt.results - The model object obtained from the compression step

  3. hmap.cols - The column to be overlaid as a heatmap on the scored output

  4. child.level - The depth level at which the new points are assigned to cells

set.seed(240)
predictions <- muHVT::predictHVT(
  testComputers,
  hvt.results,
  hmap.cols = "Quant.Error",
  child.level = 1,
  line.width = c(1.2),
  color.vec = c("#141B41"),
  quant.error.hmap = 0.1,
  n_cells.hmap = 1001
)

Let’s see which cell and level each point belongs to. For the sake of brevity, we will only show the first 10 rows.


predictions[["scoredPredictedData"]] %>% head(100) %>% 
  round(2) %>%
  as.data.frame() %>%
  Table(scroll = T, limit = 10)
Segment.Level  Segment.Parent  Segment.Child  n  Cell.ID  Quant.Error   price   speed      hd     ram  screen     ads  centroidRadius    diff  anomalyFlag
            1               1             68  1      155         0.05   -1.23   -0.78   -0.68   -0.72    0.55   -0.84            0.08    0.03            0
            1               1             15  1      987         0.03    1.38    0.09    3.06    3.19    0.55   -0.84            0.16    0.13            0
            1               1             14  1      312         0.04   -0.80    0.09   -0.68   -0.72   -0.61   -0.84            0.11    0.07            0
            1               1            697  1      663         0.04    0.23    2.67   -0.41   -0.72   -0.61   -0.84            0.00   -0.04            0
            1               1            572  1      913         0.02    0.31    0.92    1.73    1.63    0.55   -0.84            0.04    0.02            0
            1               1             89  1      848         0.14   -0.51    0.92    3.06    0.06   -0.61   -0.84            0.19    0.05            0
            1               1             15  1      987         0.08    1.07    0.09    3.06    3.19    0.55   -0.84            0.16    0.07            0
            1               1            828  1      521         0.11   -1.22    0.92   -0.08    0.06   -0.61   -0.84            0.04   -0.08            0
            1               1            641  1      433         0.11   -0.93    0.92   -0.08   -0.72   -0.61   -0.84            0.06   -0.05            0
            1               1            222  1      145         0.05   -1.12   -0.78   -0.68   -0.72   -0.61   -0.84            0.04   -0.01            0

We can see the predictions for the points in the table above. The centroid of the cell to which a point is mapped serves as the codeword (predictor) for every point assigned to that cell.
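
To make the scoring rule concrete, here is a minimal sketch of the nearest-centroid assignment underlying the prediction step (not the package's internal code). Here codebook is a hypothetical numeric matrix holding one centroid per row, and x is a single test observation scaled the same way as the training data:

# Minimal sketch of vector-quantization scoring: assign an observation
# to the cell whose centroid (codeword) is nearest in Euclidean distance.
# `codebook` and `x` are hypothetical; `x` must be scaled like the training data.
nearest_cell <- function(x, codebook) {
  d2 <- rowSums(sweep(codebook, 2, x, "-")^2)  # squared distance to each centroid
  which.min(d2)                                # index of the winning cell
}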

8 Executive Summary

This vignette demonstrated the end-to-end muHVT workflow on the ’computers’ dataset: the training data was compressed into 1001 cells using hierarchical vector quantization, the cell centroids were projected to two dimensions with Sammon’s projection, the projected centroids were used to build a Voronoi tessellation with heatmaps overlaid for each feature, and finally the test data was scored with predictHVT to identify the cell each new point belongs to.

9 Applications

  1. Pricing Segmentation - The package can be used to discover groups of similar customers based on their spend patterns and to understand the price sensitivity of customers

  2. Market Segmentation - The package can be helpful in market segmentation, where we have to identify micro and macro segments. The method used in this package can do both kinds of segmentation in one go

  3. Anomaly Detection - This method can help us categorize system behavior over time and find anomalies when the system changes, e.g. detecting fraudulent claims in healthcare insurance

  4. The package can help us understand the underlying structure of the data. Suppose we want to analyze a curved surface such as a sphere or a vase; we can approximate it by many small low-order polygons in the form of tessellations using this package

  5. In biology, Voronoi diagrams are used to model a number of different biological structures, including cells and bone microarchitecture

  6. Using the base idea of System Dynamics, these diagrams can also be used to depict customer state changes over a period of time

10 References

  1. Topology Preserving Maps: https://link.springer.com/chapter/10.1007/1-84628-118-0_7

  2. Vector Quantization: https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-450-principles-of-digital-communications-i-fall-2006/lecture-notes/book_3.pdf

  3. K-means: https://en.wikipedia.org/wiki/K-means_clustering

  4. Sammon’s Projection: https://en.wikipedia.org/wiki/Sammon_mapping

  5. Voronoi Tessellations: https://en.wikipedia.org/wiki/Centroidal_Voronoi_tessellation

  6. Embedding: https://en.wikipedia.org/wiki/Embedding